Rademacher Complexity Margin Bounds for Learning with a Large Number of Classes

Authors

  • Vitaly Kuznetsov
  • Mehryar Mohri
  • Umar Syed
Abstract

This paper presents improved Rademacher complexity margin bounds that scale linearly with the number of classes as opposed to the quadratic dependence of existing Rademacher complexity margin-based learning guarantees. We further use this result to prove a novel generalization bound for multi-class classifier ensembles that depends only on the Rademacher complexity of the hypothesis classes to which the classifiers in the ensemble belong.
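For orientation, here is a hedged sketch of the general shape such multi-class margin bounds take (the notation follows standard textbook conventions: c classes, sample size m, margin ρ, confidence 1 − δ; this is an assumed illustrative form, not a verbatim restatement of the paper's theorem):

```latex
% Illustrative form of a Rademacher complexity margin bound for c classes
% (assumed shape, not the paper's exact statement). H is the hypothesis set,
% \Pi_1(H) = \{ x \mapsto h(x, y) : y \in \mathcal{Y},\, h \in H \}, and
% \widehat{R}_{S,\rho}(h) is the empirical margin loss at margin \rho.
\[
  R(h) \;\le\; \widehat{R}_{S,\rho}(h)
        \;+\; \frac{4c}{\rho}\,\mathfrak{R}_m\bigl(\Pi_1(H)\bigr)
        \;+\; \sqrt{\frac{\log(1/\delta)}{2m}}
  \qquad \text{with probability at least } 1-\delta.
\]
% The abstract's claim concerns the factor multiplying the complexity term:
% earlier guarantees of this type incurred a factor of order c^2, whereas the
% bounds presented here scale only linearly in c.
```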


Similar resources

The Rademacher Complexity of Linear Transformation Classes

Bounds are given for the empirical and expected Rademacher complexity of classes of linear transformations from a Hilbert space H to a finite dimensional space. The results imply generalization guarantees for graph regularization and multi-task subspace learning. Rademacher averages have been introduced to learning theory as an efficient complexity measure for function classes, mot...
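Since this snippet treats Rademacher averages as an empirical complexity measure, a minimal Monte Carlo sketch may help; the norm bound B, the synthetic data, and the restriction to a scalar-valued linear class are illustrative assumptions on my part, not taken from the paper:

```python
import numpy as np

def empirical_rademacher_linear(X, B, n_trials=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity of
    {x -> <w, x> : ||w||_2 <= B} on the sample X (shape m x d).

    For this class the supremum over w has a closed form:
        sup_{||w|| <= B} (1/m) sum_i sigma_i <w, x_i>
            = (B/m) * || sum_i sigma_i x_i ||_2.
    """
    rng = np.random.default_rng(seed)
    m = X.shape[0]
    sups = []
    for _ in range(n_trials):
        sigma = rng.choice([-1.0, 1.0], size=m)  # Rademacher signs
        sups.append(B / m * np.linalg.norm(sigma @ X))
    return float(np.mean(sups))

# Illustrative usage with synthetic data.
X = np.random.default_rng(1).normal(size=(200, 10))
est = empirical_rademacher_linear(X, B=1.0)
# Classical upper bound B * sqrt(sum_i ||x_i||^2) / m for comparison.
bound = 1.0 * np.sqrt((X ** 2).sum()) / X.shape[0]
print(f"MC estimate: {est:.4f}  vs. upper bound: {bound:.4f}")
```

Because the supremum has a closed form for norm-constrained linear maps, the estimate is exact up to Monte Carlo error over the sign draws.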

A Note on Extending Generalization Bounds for Binary Large-Margin Classifiers to Multiple Classes

A generic way to extend generalization bounds for binary large-margin classifiers to large-margin multi-category classifiers is presented. This simple procedure leads to surprisingly tight bounds, showing the same Õ(d) scaling in the number d of classes as state-of-the-art results. The approach is exemplified by extending a textbook bound based on Rademacher complexity, which leads to a multi-cl...

Rademacher complexity properties 2: finite classes and margin losses

To make this meaningful to machine learning, we need to replace Ef with some form of risk. Today we will discuss three choices. 1. R_ℓ where ℓ is Lipschitz. We covered this last time but will recap a little. 2. R_z(f) := Pr[f(X) ≠ Y]; for this we'll use finite classes and discuss shatter coefficients and VC dimension. 3. R_γ(f) = R_{ℓ_γ} where ℓ_γ(z) := max{0, min{z/γ + 1, 1}}, which will lead to nice bounds for...
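To make the third choice concrete, here is a minimal sketch of that margin loss ℓ_γ (the function and variable names are mine; the formula is exactly the one quoted above):

```python
import numpy as np

def ramp_loss(z, gamma):
    """The margin (ramp) loss l_gamma(z) = max(0, min(z/gamma + 1, 1)) from
    the snippet above: it equals 0 for z <= -gamma, 1 for z >= 0, and
    interpolates linearly in between, so it is (1/gamma)-Lipschitz and
    upper-bounds the 0-1 loss when evaluated at z = -y * f(x)."""
    return np.maximum(0.0, np.minimum(z / gamma + 1.0, 1.0))

# A few illustrative values with gamma = 0.5.
for z in (-1.0, -0.5, -0.25, 0.0, 1.0):
    print(z, ramp_loss(np.array(z), 0.5))
```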

On the Complexity of Linear Prediction: Risk Bounds, Margin Bounds, and Regularization

This work characterizes the generalization ability of algorithms whose predictions are linear in the input vector. To this end, we provide sharp bounds for Rademacher and Gaussian complexities of (constrained) linear classes, which directly lead to a number of generalization bounds. This derivation provides simplified proofs of a number of corollaries including: risk bounds for linear predictio...
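A representative instance of the sharp complexity bounds this line of work gives, stated here from the standard result for ℓ2-constrained linear classes (an assumed example, not quoted from the paper):

```latex
% Empirical Rademacher complexity of an l2-constrained linear class
% H = { x -> <w, x> : ||w||_2 <= B } on a sample S = (x_1, ..., x_m).
\[
  \widehat{\mathfrak{R}}_S(H)
  \;=\; \mathbb{E}_{\sigma}\Bigl[\sup_{\|w\|_2 \le B}
        \frac{1}{m}\sum_{i=1}^{m}\sigma_i \langle w, x_i\rangle\Bigr]
  \;\le\; \frac{B}{m}\sqrt{\sum_{i=1}^{m}\|x_i\|_2^2},
\]
% which, combined with a margin (Lipschitz contraction) argument, yields risk
% bounds of order B * max_i ||x_i|| / (gamma * sqrt(m)) for linear prediction.
```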

Maximum Relative Margin and Data-Dependent Regularization

Leading classification methods such as support vector machines (SVMs) and their counterparts achieve strong generalization performance by maximizing the margin of separation between data classes. While the maximum margin approach has achieved promising performance, this article identifies its sensitivity to affine transformations of the data and to directions with large data spread. Maximum mar...


Journal:

Volume   Issue

Pages  -

Publication date: 2015